Occam and Compression Algorithms: General PAC Learning Results
Abstract
These notes are slightly edited from scribe notes of previous years. Please consult the handout of slide copies for definitions and theorem statements.

Theorem 0 (from handout). Let X be a domain of examples, and let C, H be concept classes over X. Let A be a learning algorithm such that, for every c ∈ C, A takes a sample S of m examples and outputs a hypothesis h ∈ H consistent with S. Then, when using a sample of size $m \ge \frac{1}{\epsilon} \ln \frac{|H|}{\delta}$, A is a PAC learning algorithm for C using H. The bound follows from a union bound: a fixed hypothesis with error greater than $\epsilon$ is consistent with m independent examples with probability at most $(1-\epsilon)^m \le e^{-\epsilon m}$, so requiring $|H| e^{-\epsilon m} \le \delta$ yields the stated sample size. Note that a similar result has already been proved in Lecture 1, in the context of one of the examples.
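To make the bound concrete, here is a minimal Python sketch (the function name and the example numbers are illustrative, not from the notes) that computes the smallest sample size satisfying Theorem 0:

```python
import math

def occam_sample_size(h_size: int, epsilon: float, delta: float) -> int:
    """Smallest integer m with m >= (1/epsilon) * ln(|H| / delta)."""
    return math.ceil(math.log(h_size / delta) / epsilon)

# Illustrative numbers: |H| = 2**20 hypotheses, epsilon = 0.1, delta = 0.05.
print(occam_sample_size(2**20, 0.1, 0.05))  # prints 169
```

Note that the sample size grows only logarithmically in |H|, which is what makes Occam ("succinct hypothesis") arguments give nontrivial bounds even for very large hypothesis classes.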
Similar Papers
Agnostic Learning and Structural Risk Minimization
In this lecture we address some limitations of the analysis of Occam algorithms that limit their applicability. We first discuss the case where the target concept c is not in H, known as the non-realizable case or as agnostic PAC learning. We then turn to the case where the number of examples m is fixed (i.e., unlike in the standard PAC model, we cannot ask for more examples) and consider h...
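For the non-realizable case, the standard replacement for the consistency argument is Hoeffding's inequality plus a union bound: $m \ge \frac{1}{2\epsilon^2} \ln \frac{2|H|}{\delta}$ examples suffice for every $h \in H$ to have empirical error within $\epsilon$ of its true error, so empirical risk minimization is near-optimal. A minimal sketch (the function name and numbers are illustrative):

```python
import math

def agnostic_sample_size(h_size: int, epsilon: float, delta: float) -> int:
    """Smallest m with 2|H| * exp(-2 * m * epsilon**2) <= delta
    (Hoeffding's inequality plus a union bound over H)."""
    return math.ceil(math.log(2 * h_size / delta) / (2 * epsilon ** 2))

# Same illustrative numbers as above; note the 1/epsilon**2 dependence.
print(agnostic_sample_size(2**20, 0.1, 0.05))  # prints 878
```

The quadratic dependence on $1/\epsilon$ (versus linear in the realizable case) is the price of dropping the assumption that the target concept lies in H.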
Partial Occam's Razor and Its Applications
We introduce the notion of "partial Occam algorithm". A partial Occam algorithm produces a succinct hypothesis that is partially consistent with the given examples, where the proportion of consistent examples is a bit more than half. By using this new notion, we propose one approach for obtaining a PAC learning algorithm. First, as shown in this paper, a partial Occam algorithm is equivalent to a ...
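The truncated sentence presumably continues to an equivalence between partial Occam algorithms and weak learners; boosting then converts a weak learner into a (strong) PAC learner. As an illustration of that second step only (the dataset, the stump weak learner, and the round count are illustrative choices, not taken from the paper), here is a minimal AdaBoost sketch in Python:

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # Predict +1 when polarity * x[feature] < polarity * threshold, else -1.
    return np.where(polarity * X[:, feature] < polarity * threshold, 1, -1)

def best_stump(X, y, w):
    # Weak learner: exhaustively pick the stump with lowest weighted error.
    # It plays the role of a partial Occam algorithm: it only needs
    # weighted accuracy slightly above 1/2.
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for p in (1, -1):
                err = w[stump_predict(X, f, t, p) != y].sum()
                if err < best_err:
                    best_err, best = err, (f, t, p)
    return best, best_err

def adaboost(X, y, rounds=30):
    n = len(y)
    w = np.full(n, 1.0 / n)                     # uniform initial distribution
    ensemble = []
    for _ in range(rounds):
        (f, t, p), err = best_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this round's stump
        pred = stump_predict(X, f, t, p)
        w *= np.exp(-alpha * y * pred)          # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, f, t, p))
    return ensemble

def predict(ensemble, X):
    votes = sum(a * stump_predict(X, f, t, p) for a, f, t, p in ensemble)
    return np.sign(votes)

# Toy usage: a diagonal boundary that no single axis-aligned stump can fit.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
ensemble = adaboost(X, y)
print((predict(ensemble, X) == y).mean())  # training accuracy well above any single stump
```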
Extending Occam's Razor
Occam's Razor states that, all other things being equal, the simpler of two possible hypotheses is to be preferred. A quantified version of Occam's Razor has been proven for the PAC model of learning, giving sample-complexity bounds for learning using what Blumer et al. call an Occam algorithm [1]. We prove an analog of this result for Haussler's more general learning model, which encompasses le...
PAC-Bayes Risk Bounds for Stochastic Averages and Majority Votes of Sample-Compressed Classifiers
We propose a PAC-Bayes theorem for the sample-compression setting where each classifier is described by a compression subset of the training data and a message string of additional information. This setting, which is the appropriate one to describe many learning algorithms, strictly generalizes the usual data-independent setting where classifiers are represented only by data-independent message...